
    Nonuniform Sparse Recovery with Subgaussian Matrices

    Compressive sensing predicts that sufficiently sparse vectors can be recovered from highly incomplete information. Efficient recovery methods such as $\ell_1$-minimization find the sparsest solution to certain systems of equations. Random matrices have become a popular choice for the measurement matrix; indeed, near-optimal uniform recovery results have been shown for such matrices. In this note we focus on nonuniform recovery using Gaussian random matrices and $\ell_1$-minimization. We provide a condition on the number of samples, in terms of the sparsity and the signal length, which guarantees that a fixed sparse signal can be recovered with a random draw of the matrix using $\ell_1$-minimization. The constant 2 in the condition is optimal, and the proof is rather short compared to that of a similar result due to Donoho and Tanner.
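    To make the recovery setting above concrete, here is a minimal sketch (not taken from the paper, and not its proof machinery): a fixed sparse signal is measured with a random Gaussian matrix and recovered by $\ell_1$-minimization (basis pursuit), solved as a linear program. The dimensions N, s, and m below are illustrative assumptions.

```python
# Minimal sketch: nonuniform sparse recovery of a fixed s-sparse signal from
# m Gaussian measurements via l1-minimization (basis pursuit), written as an LP.
# All problem sizes are illustrative assumptions, not values from the paper.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, s = 200, 8                      # signal length and sparsity (assumed)
m = 60                             # number of measurements (assumed)

# Fixed s-sparse signal and a random Gaussian measurement matrix.
x = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
x[support] = rng.standard_normal(s)
A = rng.standard_normal((m, N)) / np.sqrt(m)
y = A @ x

# Basis pursuit: min ||z||_1 s.t. A z = y, as an LP in z = u - v with u, v >= 0.
c = np.ones(2 * N)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
z = res.x[:N] - res.x[N:]

print("recovery error:", np.linalg.norm(z - x))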

    Compressive sensing Petrov-Galerkin approximation of high-dimensional parametric operator equations

    We analyze the convergence of compressive sensing based sampling techniques for the efficient evaluation of functionals of solutions for a class of high-dimensional, affine-parametric, linear operator equations which depend on possibly infinitely many parameters. The proposed algorithms are based on so-called "non-intrusive" sampling of the high-dimensional parameter space, reminiscent of Monte Carlo sampling. In contrast to Monte Carlo, however, a functional of the parametric solution is then computed via compressive sensing methods from samples of functionals of the solution. A key ingredient in our analysis, of independent interest, consists in a generalization of recent results on the approximate sparsity of generalized polynomial chaos (gpc) representations of the parametric solution families, in terms of the gpc series with respect to tensorized Chebyshev polynomials. In particular, we establish sufficient conditions on the parametric inputs to the parametric operator equation such that the Chebyshev coefficients of the gpc expansion are contained in certain weighted $\ell_p$-spaces for $0 < p \leq 1$. Based on this, we show that reconstructions of the parametric solutions computed from the sampled problems converge, with high probability, at the $L_2$, resp. $L_\infty$, convergence rates afforded by best $s$-term approximations of the parametric solution, up to logarithmic factors.
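    The non-intrusive sampling idea can be illustrated with a toy sketch (again not the paper's algorithm): sample the parameter domain at random points, evaluate a scalar quantity of interest there, assemble a measurement matrix of tensorized Chebyshev polynomials, and recover an approximately sparse coefficient vector by $\ell_1$-minimization. The function f below is a hypothetical stand-in for a functional of the operator-equation solution, and d, deg, and m are assumed values.

```python
# Toy sketch of non-intrusive sampling + sparse Chebyshev reconstruction.
# f is a hypothetical surrogate, not from the paper; all sizes are assumptions.
import itertools
import numpy as np
from numpy.polynomial.chebyshev import chebval
from scipy.optimize import linprog

rng = np.random.default_rng(1)
d, deg = 2, 7                          # number of parameters and max degree (assumed)
indices = list(itertools.product(range(deg + 1), repeat=d))  # tensorized multi-indices

def f(yy):
    """Hypothetical parametric quantity of interest on [-1, 1]^d."""
    return 1.0 / (4.0 + yy[0] + 0.5 * yy[1])

m = 40                                 # number of random parameter samples (assumed)
Y = rng.uniform(-1.0, 1.0, size=(m, d))
b = np.array([f(yy) for yy in Y])

def cheb1(n, t):
    """Evaluate the univariate Chebyshev polynomial T_n at t."""
    e = np.zeros(n + 1)
    e[n] = 1.0
    return chebval(t, e)

# Measurement matrix: products of univariate Chebyshev polynomials per multi-index.
A = np.array([[np.prod([cheb1(nu[k], yy[k]) for k in range(d)]) for nu in indices]
              for yy in Y])

# Basis pursuit with equality constraints, as in the previous sketch.
n = A.shape[1]
res = linprog(np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=b,
              bounds=(0, None), method="highs")
c = res.x[:n] - res.x[n:]
print("largest Chebyshev coefficient magnitudes:", np.sort(np.abs(c))[-5:][::-1])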